Results 1 - 20 of 29
1.
Am J Epidemiol ; 193(2): 267-276, 2024 Feb 05.
Article in English | MEDLINE | ID: mdl-37715454

ABSTRACT

Estimates of excess mortality can provide insight into direct and indirect impacts of the coronavirus disease 2019 (COVID-19) pandemic beyond deaths specifically attributed to COVID-19. We analyzed death certificate data from Baltimore City, Maryland, from March 1, 2020, to March 31, 2021, and found that 1,725 individuals (95% confidence interval: 1,495, 1,954) died in excess of what was expected from all-cause mortality trends in 2016-2019; 1,050 (61%) excess deaths were attributed to COVID-19. Observed mortality was 23%-32% higher than expected among individuals aged 50 years and older. Non-White residents of Baltimore City also experienced 2 to 3 times higher rates of excess mortality than White residents (e.g., 37.4 vs. 10.7 excess deaths per 10,000 population among Black residents vs. White residents). There was little to no observed excess mortality among residents of hospice, long-term care, and nursing home facilities, even though these residents accounted for nearly 30% (312/1,050) of recorded COVID-19 deaths. There was significant geographic variation in excess mortality within the city, largely following racial population distributions. These results demonstrate the substantial and unequal impact of the COVID-19 pandemic on Baltimore City residents and the importance of building robust, timely surveillance systems to track disparities and inform targeted strategies to remediate the impact of future epidemics.
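The excess-mortality figure here is an observed-minus-expected calculation against the 2016-2019 baseline. A minimal sketch of that general approach in Python, assuming a monthly death-count table and a Poisson baseline with trend and seasonality; the column names, placeholder data, and model form are illustrative assumptions, not the authors' actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Hypothetical monthly all-cause death counts; 2016-2019 is the baseline.
df = pd.DataFrame({
    "year":   np.repeat(np.arange(2016, 2021), 12),
    "month":  np.tile(np.arange(1, 13), 5),
    "deaths": rng.poisson(550, 60),  # placeholder data
})
baseline = df[df["year"] <= 2019]

# Poisson baseline with a linear trend and month-of-year seasonality.
model = smf.glm("deaths ~ year + C(month)", data=baseline,
                family=sm.families.Poisson()).fit()

# Expected deaths for the pandemic period, given pre-2020 trends.
pandemic = df[df["year"] == 2020].copy()
pandemic["expected"] = model.predict(pandemic)

# Excess = observed - expected, summed over the period.
excess = (pandemic["deaths"] - pandemic["expected"]).sum()
print(f"Estimated excess deaths: {excess:.0f}")
```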


Subject(s)
COVID-19, Humans, Middle Aged, Aged, Pandemics, Baltimore/epidemiology, Black People, Demography, Mortality
2.
Inj Prev ; 29(1): 85-90, 2023 02.
Article in English | MEDLINE | ID: mdl-36301795

ABSTRACT

INTRODUCTION: Non-fatal shooting rates vary tremendously within cities in the USA. Factors related to structural racism (both historical and contemporary) could help explain differences in non-fatal shooting rates at the neighbourhood level. Most research assessing the relationship between structural racism and firearm violence includes only one dimension of structural racism. Our study uses an intersectional approach to examine how the interaction of two forms of structural racism is associated with spatial non-fatal shooting disparities in Baltimore, Maryland. METHODS: We present three additive interaction measures to describe the joint association of historical redlining and contemporary racialized economic segregation with neighbourhood-level non-fatal shootings. RESULTS: Our findings revealed that sustained disadvantage census tracts (tracts that experience contemporary socioeconomic disadvantage and were historically redlined) have the highest burden of non-fatal shootings. Sustained disadvantage tracts had on average 24 more non-fatal shootings a year per 10 000 residents compared with similarly populated sustained advantage tracts (tracts that experience contemporary socioeconomic advantage and were not historically redlined). Moreover, we found that between 2015 and 2019, the interaction between redlining and racialized economic segregation explained over one-third of non-fatal shootings (approximately 650 shootings) in sustained disadvantage tracts. CONCLUSION: These findings suggest that the intersection of historical and contemporary structural racism is a fundamental cause of firearm violence inequities in Baltimore. Intersectionality can advance injury prevention research and practice by (1) serving as an analytical tool to expose inequities in injury-related outcomes and (2) informing the development and implementation of injury prevention interventions and policies that prioritise health equity and racial justice.
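The "additive interaction measures" mentioned in the methods are typically quantities such as the interaction contrast (the excess rate beyond what the two exposures contribute separately) and the proportion of the doubly exposed burden attributable to the interaction. A minimal worked sketch with placeholder rates, not the paper's estimates:

```python
# Hypothetical non-fatal shooting rates per 10,000 residents for the four
# tract types formed by crossing the two exposures (placeholder values).
r00 = 5.0   # not redlined, contemporary advantage (reference)
r10 = 14.0  # historically redlined only
r01 = 16.0  # contemporary disadvantage only
r11 = 29.0  # sustained disadvantage (both exposures)

# Interaction contrast on the additive (rate) scale: the excess rate
# beyond what the two exposures would contribute separately.
ic = r11 - r10 - r01 + r00

# Share of the doubly exposed burden attributable to the interaction.
attributable_proportion = ic / r11

print(f"Interaction contrast: {ic:.1f} per 10,000")
print(f"Proportion of r11 due to interaction: {attributable_proportion:.0%}")
```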


Subject(s)
Firearms, Systemic Racism, Humans, Baltimore/epidemiology, Intersectional Framework, Residence Characteristics
3.
Front Epidemiol ; 3: 1128501, 2023.
Article in English | MEDLINE | ID: mdl-38455887

ABSTRACT

Epidemiologic investigations of extreme precipitation events (EPEs) often rely on observations from the nearest weather station to represent individuals' exposures, and because structural factors determine the siting of weather stations, levels of measurement error and misclassification bias may differ by race, class, and other measures of social vulnerability. Gridded climate datasets provide higher spatial resolution that may reduce measurement error and misclassification bias. However, how closely these types of datasets agree in identifying EPEs has not been explored. In this study, we characterize the overall and temporal patterns of agreement among three commonly used meteorological data sources in their identification of EPEs in all census tracts and counties in the conterminous United States over the 1991-2020 U.S. Climate Normals period and evaluate the association of sociodemographic characteristics with agreement in EPE identification. Daily precipitation measurements from weather stations in the Global Historical Climatology Network (GHCN) and gridded precipitation estimates from the Parameter-elevation Relationships on Independent Slopes Model (PRISM) and the North American Land Data Assimilation System (NLDAS) were compared in their ability to identify EPEs, defined as the top 1% of precipitation events or daily precipitation >1 inch. Agreement among these datasets is fair to moderate over the 1991-2020 period, with spatial and temporal differences in the levels of agreement between ground stations and gridded climate datasets. Spatial variation in agreement is most strongly related to a location's proximity to the nearest ground station, with areas furthest from a ground station demonstrating the lowest levels of agreement. These areas have lower socioeconomic status, a higher proportion of Native American population, and higher social vulnerability index scores. The addition of ground stations in these areas may increase agreement, and future studies intending to use these or similar data sources should be aware of the limitations, biases, and potential for differential misclassification of exposure to EPEs. Most importantly, vulnerable populations should be engaged to determine their priorities for enhanced surveillance of climate-based threats so that community-identified needs are met by any future improvements in data quality.
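Agreement described as "fair to moderate" is consistent with a kappa-type statistic computed over daily binary EPE flags. A minimal sketch, assuming aligned daily precipitation series from a station and a gridded product; the simulated data and the choice of Cohen's kappa are assumptions, though the threshold logic mirrors the definition above (top 1% of events or >1 inch, i.e., 25.4 mm):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(0)

# Placeholder daily precipitation (mm) for one location over 30 years,
# standing in for GHCN station data and a gridded product (PRISM/NLDAS).
station = rng.gamma(shape=0.3, scale=8.0, size=10957)
gridded = np.clip(station * rng.normal(1.0, 0.25, station.size), 0, None)

def epe_flags(precip_mm):
    """Flag extreme precipitation events: top 1% of wet days or >1 inch."""
    wet = precip_mm[precip_mm > 0]
    top1 = np.quantile(wet, 0.99)
    return (precip_mm >= top1) | (precip_mm > 25.4)

kappa = cohen_kappa_score(epe_flags(station), epe_flags(gridded))
print(f"Cohen's kappa for EPE identification: {kappa:.2f}")
```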

4.
Article in English | MEDLINE | ID: mdl-35010425

ABSTRACT

Extreme precipitation events (EPEs) change the natural and built environments and alter human behavior in ways that facilitate infectious disease transmission. EPEs are expected with high confidence to increase in frequency and are thus of great public health importance. This scoping review seeks to summarize the mechanisms and severity of impacts of EPEs on infectious diseases, to provide a conceptual framework for the influence of EPEs on infectious respiratory diseases, and to define areas of future study currently lacking in this field. The effects of EPEs are well studied with respect to enteric, vector-borne, and allergic illness, where they are shown to moderately increase risk of illness, but they are not well understood in relation to infectious respiratory illness. We propose a framework for a similar influence of EPEs on infectious respiratory viruses through several plausible pathways: decreased UV radiation, increased ambient relative humidity, and changes to human behavior (increased time indoors and use of heating and cooling systems). However, limited work has evaluated meteorologic risk factors for infectious respiratory diseases. Future research is needed to evaluate the effects of EPEs on infectious respiratory diseases using individual-level case surveillance, fine spatial scales, and lag periods suited to the incubation periods of the disease under study, as well as a full characterization of susceptible, vulnerable, and sensitive population characteristics.


Subject(s)
Communicable Diseases, Viruses, Animals, Climate Change, Disease Vectors, Humans, Public Health
5.
Environ Res ; 185: 109384, 2020 06.
Article in English | MEDLINE | ID: mdl-32240840

ABSTRACT

BACKGROUND: Hurricane Katrina made landfall in New Orleans, Louisiana as a Category 3 storm in August 2005. Storm surges, levee failures, and the low-lying nature of New Orleans led to widespread flooding, damage to over 70% of occupied housing, and evacuation of 80-90% of city residents. Only 57% of the city's black population has returned, and many residents complain of gentrification following rebuilding efforts. Climate gentrification is a recently described phenomenon whereby the effects of climate change, most notably rising sea levels and more frequent flooding and storm surges, alter housing values in a way that leads to gentrification. OBJECTIVE: To examine climate gentrification following Hurricane Katrina by (1) estimating the associations between flooding severity, ground elevation, and gentrification and (2) evaluating whether these relationships are modified by neighborhood-level pre- and post-storm sociodemographic factors. METHODS: Lidar data collected in 2002 were used to determine elevation. Water gauge height of Lake Pontchartrain was used to estimate flood depth. Using census tracts as a proxy for neighborhoods, demographic, housing, and economic data from the 2000 decennial census and the 2010 and 2015 American Community Survey 5-year estimates were used to identify census tracts eligible for gentrification (median income below the 2000 Orleans Parish median income). A gentrification index was created using tract changes in education level, population above the poverty limit, and median household income. Proportional odds ordinal logistic regression was used with product terms to test for effect measure modification by sociodemographic factors. RESULTS: Census tracts eligible for gentrification in 2000 were 80.2% black. Median census tract flood depth was significantly lower in areas eligible to undergo gentrification (0.70 m vs. 1.03 m). Residents of gentrification-eligible tracts in 2000 were significantly more likely to be black, less educated, lower income, unemployed, and renting rather than owning their home. By 2015, the eligible tracts that underwent gentrification had become significantly whiter, more educated, higher income, less unemployed, and more likely to contain multi-unit dwellings. Gentrification was inversely associated with flood depth and directly associated with ground elevation in eligible tracts. Marginal effect modification of the relationships of flood depth and elevation with gentrification by pre-storm black race was detected. CONCLUSIONS: Gentrification was strongly associated with higher ground elevation in New Orleans. These results support the idea of climate gentrification described in other low-elevation major metropolitan areas such as Miami, FL. High-elevation, low-income, demographically transitional areas (that is, areas that more closely resemble high-income area demographics) may be particularly vulnerable to future climate gentrification.
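The model described here is a proportional-odds ordinal logistic regression with product terms for effect-measure modification. A minimal sketch using statsmodels' OrderedModel on simulated tract-level data; the variable names, simulated values, and three-level index are illustrative assumptions, not the authors' dataset:

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)
n = 180  # placeholder number of gentrification-eligible tracts

# Hypothetical tract-level data: flood depth (m), elevation (m), and a
# pre-storm %black covariate used to test effect-measure modification.
df = pd.DataFrame({
    "flood_depth": rng.gamma(2.0, 0.5, n),
    "elevation": rng.normal(0.5, 1.0, n),
    "pct_black_2000": rng.uniform(0.2, 1.0, n),
})
# Ordinal gentrification index (0 = none, 1 = partial, 2 = gentrified),
# simulated so depth lowers and elevation raises the latent score.
latent = -0.8 * df["flood_depth"] + 0.9 * df["elevation"] + rng.logistic(size=n)
df["gentrification"] = pd.cut(latent, [-np.inf, -0.5, 0.8, np.inf], labels=False)

# Proportional-odds model with a product term for effect modification.
exog = df[["flood_depth", "elevation", "pct_black_2000"]].copy()
exog["depth_x_black"] = exog["flood_depth"] * exog["pct_black_2000"]

res = OrderedModel(df["gentrification"], exog, distr="logit").fit(
    method="bfgs", disp=False)
print(res.summary())
```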


Subject(s)
Cyclonic Storms, Cities, Humans, Louisiana, New Orleans, Spatial Analysis
6.
Am J Sports Med ; 47(5): 1096-1102, 2019 04.
Article in English | MEDLINE | ID: mdl-30943085

ABSTRACT

BACKGROUND: There has been a renewed interest in ulnar collateral ligament (UCL) repair in overhead athletes because of a greater understanding of UCL injuries, an improvement in fixation technology, and the extensive rehabilitation time required to return to play after traditional reconstruction. PURPOSE/HYPOTHESIS: To evaluate the clinical outcomes of a novel technique of UCL repair with internal brace augmentation in overhead throwers. STUDY DESIGN: Case series; Level of evidence, 4. METHODS: Patients undergoing a novel technique of UCL repair with internal brace augmentation were prospectively followed for a minimum of 1 year. Potential candidates for repair were selected after the failure of nonoperative treatment when imaging suggested a complete or partial avulsion of the UCL from either the sublime tubercle or medial epicondyle, without evidence of poor tissue quality of the ligament. The final decision on UCL repair or traditional reconstruction was determined intraoperatively. Demographic and operative data were collected at the time of surgery. Return to play and Kerlan-Jobe Orthopaedic Clinic (KJOC) scores were collected at 1 year and again at 2 years postoperatively. RESULTS: Of the 111 overhead athletes eligible for the study, 92% (102/111) of those who desired to return to the same or higher level of competition were able to do so, at a mean time of 6.7 months. These patients had a mean KJOC score of 88.2 at final follow-up. CONCLUSION: UCL repair with internal brace augmentation is a viable option for amateur overhead throwers with selected UCL injuries who wish to return to sport in a shorter time frame than allowed by traditional UCL reconstruction.


Subject(s)
Athletic Injuries/surgery, Braces, Collateral Ligament, Ulnar/injuries, Surgical Tape, Ulnar Collateral Ligament Reconstruction/instrumentation, Ulnar Collateral Ligament Reconstruction/methods, Adolescent, Baseball/injuries, Collagen, Female, Follow-Up Studies, Humans, Male, Prospective Studies, Young Adult
7.
J Athl Train ; 54(3): 296-301, 2019 Mar.
Article in English | MEDLINE | ID: mdl-30721094

ABSTRACT

CONTEXT: High loads in the elbow during baseball pitching can lead to serious injuries, including injuries to the ulnar collateral ligament. These injuries have substantial implications for individual pitchers and their teams, especially at the professional level of competition. With a trend toward increased ball velocity in professional baseball, controversy still exists regarding the strength of the relationship between ball velocity and elbow-varus torque. OBJECTIVE: To examine the relationship between fastball velocity and elbow-varus torque in professional pitchers using between- and within-subjects statistical analyses. DESIGN: Cross-sectional study. SETTING: Motion-analysis laboratory. PATIENTS OR OTHER PARTICIPANTS: Using the previously collected biomechanical data of 452 professional baseball pitchers, we performed a retrospective analysis of the 64 pitchers (52 right-hand dominant, 12 left-hand dominant; age = 21.8 ± 2.0 years, height = 1.90 ± 0.05 m, mass = 94.6 ± 7.8 kg) with fastball velocity distributions that enabled between- and within-subjects statistical analyses. MAIN OUTCOME MEASURE(S): We measured ball velocity using a radar gun and 3-dimensional motion data using a 12-camera automated motion-capture system sampling at 240 Hz. We calculated elbow-varus torque using inverse-dynamics techniques and then analyzed the relationship between ball velocity and elbow torque using both a simple linear regression model and a mixed linear model with random intercepts. RESULTS: The between-subjects analyses displayed a weak positive association between ball velocity and elbow-varus torque (R2 = 0.076, P = .03). The within-subjects analyses showed a considerably stronger positive association (R2 = 0.957, P < .001). CONCLUSIONS: When comparing 2 professional baseball pitchers, higher velocity may not necessarily indicate higher elbow-varus torque due to the confounding effects of pitcher-specific differences (eg, detailed anthropometrics and pitching mechanics). However, within an individual pitcher, higher ball velocity was strongly associated with higher elbow-varus torque.
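The between- versus within-subjects contrast reported here corresponds to fitting a pooled ordinary least squares regression versus a mixed linear model with a random intercept per pitcher. A minimal sketch on simulated pitch-level data; all values are placeholders chosen to reproduce the qualitative pattern, not the study's measurements:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)

# Simulated data: 64 pitchers x 20 pitches. Each pitcher has his own
# baseline torque (random intercept), but within a pitcher torque rises
# tightly with velocity -- the pattern reported in this study.
rows = []
for pitcher in range(64):
    base_torque = rng.normal(50, 10)                      # pitcher offset (N*m)
    velo = rng.normal(rng.uniform(38, 44), 0.8, size=20)  # ball velocity (m/s)
    torque = base_torque + 2.5 * (velo - 40) + rng.normal(0, 0.5, size=20)
    rows += [{"pitcher": pitcher, "velocity": v, "torque": t}
             for v, t in zip(velo, torque)]
df = pd.DataFrame(rows)

# Between-subjects view: pooled regression (weak association).
ols = smf.ols("torque ~ velocity", data=df).fit()
print(f"Pooled R^2: {ols.rsquared:.3f}")

# Within-subjects view: mixed model with a random intercept per pitcher.
mixed = smf.mixedlm("torque ~ velocity", data=df, groups=df["pitcher"]).fit()
print(mixed.summary())
```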


Subject(s)
Athletic Injuries, Baseball/injuries, Collateral Ligament, Ulnar, Elbow Injuries, Elbow Joint, Anthropometry/methods, Athletic Injuries/etiology, Athletic Injuries/physiopathology, Athletic Injuries/prevention & control, Biomechanical Phenomena, Collateral Ligament, Ulnar/injuries, Collateral Ligament, Ulnar/physiopathology, Cross-Sectional Studies, Elbow Joint/physiopathology, Humans, Male, Retrospective Studies, Rotation, Torque, Young Adult
8.
Sports Biomech ; 18(4): 448-455, 2019 Aug.
Article in English | MEDLINE | ID: mdl-29562832

ABSTRACT

While 10% of the general population is left-handed, 27% of professional baseball pitchers are left-handed. Biomechanical differences between left- and right-handed college pitchers have been previously reported, but these differences have yet to be examined at the professional level. Therefore, the purpose of this study was to compare pitching biomechanics between left- and right-handed professional pitchers. It was hypothesised that there would be significant kinematic and kinetic differences between these two groups. Pitching biomechanics were collected on 96 left-handed pitchers and a group of 96 right-handed pitchers matched for age, height, mass and ball velocity. Student t-tests were used to identify kinematic and kinetic differences (p < 0.05). Of the 31 variables tested, only four were significantly different between the groups. Landing position of the stride foot, trunk separation at foot contact, maximum shoulder external rotation and trunk forward tilt at ball release were all significantly greater in right-handed pitchers. The magnitudes of the statistical differences were small and not consistent with the differences reported in the two previous, smaller studies. Thus, the differences found may be of minimal practical significance, and mechanics can be taught the same way to all pitchers, regardless of throwing hand.


Subject(s)
Baseball/physiology, Functional Laterality/physiology, Motor Skills/physiology, Arm/physiology, Biomechanical Phenomena, Elbow/physiology, Foot/physiology, Humans, Male, Retrospective Studies, Rotation, Shoulder/physiology, Task Performance and Analysis, Torso/physiology
9.
Sports Health ; 10(5): 469, 2018.
Article in English | MEDLINE | ID: mdl-30153101
10.
Orthop J Sports Med ; 6(4): 2325967118764657, 2018 Apr.
Article in English | MEDLINE | ID: mdl-29687011

ABSTRACT

BACKGROUND: Recent reports have highlighted the progressive increase in the incidence of ulnar collateral ligament (UCL) injuries to the elbow in baseball players of all levels. However, knowledge of the incidence and other epidemiological factors regarding UCL injuries, specifically in college baseball players, is currently lacking. PURPOSE: To evaluate, over a period of 1 year, the incidence of UCL injuries requiring surgery in National Collegiate Athletic Association (NCAA) Division I baseball programs. STUDY DESIGN: Descriptive epidemiology study. METHODS: A total of 155 Division I collegiate baseball programs agreed to participate in the study. Demographics (position, year, background [location of high school]) for all players on these rosters were obtained from public websites. At the conclusion of the 2017 collegiate baseball season, the athletic trainer for each program entered anonymous, detailed information on injured players through an electronic survey into a secured database. RESULTS: All 155 teams enrolled in the study completed the electronic survey. Of the 5295 collegiate baseball players on these rosters, 134 underwent surgery for an injured UCL (2.5% of all eligible athletes), resulting in a team surgery rate of 0.86 per program for 1 year. These 134 players came from 88 teams; thus, 56.8% of the study teams had at least 1 player undergo surgery during the year. The surgery rate was 2.5 per 100 player-seasons for all players and was significantly higher among pitchers (4.4/100 player-seasons) than nonpitchers (0.7/100 player-seasons). The surgery rate was also significantly higher in underclassmen (3.1/100 player-seasons among freshmen and sophomores) than upperclassmen (1.9/100 player-seasons among juniors and seniors) (incidence rate ratio, 1.7; 95% CI, 1.1-2.4). Players from traditionally warm-weather states did not undergo UCL surgery at a significantly different rate from players from traditionally cold-weather states (2.7/100 player-seasons vs 2.1/100 player-seasons, respectively). Nearly half of the surgeries (48.5%) were performed during the baseball season. CONCLUSION: The incidence of UCL surgeries in NCAA Division I collegiate baseball players represents substantial morbidity in this young athletic population. Risk factors for injuries requiring surgery include being a pitcher and being an underclassman. Awareness of these factors should be incorporated into injury prevention programs. Furthermore, this initial study can serve as a foundation for tracking these surgical injuries in future years and identifying trends over time.
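An incidence rate ratio with its 95% CI, as reported above, can be computed from case counts and player-season denominators with the standard Wald interval on the log scale. A minimal sketch; the denominators below are illustrative values roughly consistent with the reported rates, not the study's exact counts:

```python
import math

def irr_wald_ci(cases1, persontime1, cases0, persontime0, z=1.96):
    """Incidence rate ratio with a Wald 95% CI on the log scale."""
    irr = (cases1 / persontime1) / (cases0 / persontime0)
    se_log = math.sqrt(1 / cases1 + 1 / cases0)
    lo = math.exp(math.log(irr) - z * se_log)
    hi = math.exp(math.log(irr) + z * se_log)
    return irr, lo, hi

# Underclassmen vs upperclassmen (illustrative player-season counts).
irr, lo, hi = irr_wald_ci(cases1=83, persontime1=2678,
                          cases0=51, persontime0=2684)
print(f"IRR = {irr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```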

11.
Sports Biomech ; 17(3): 314-321, 2018 Sep.
Article in English | MEDLINE | ID: mdl-28743205

ABSTRACT

The purpose of this study was to determine how often flaws in pitching mechanics identified from biomechanical analysis are corrected. The biomechanics of 46 baseball pitchers were evaluated twice, with an average of 12 months (range 2-48 months) between evaluations. Pitchers were healthy at the time of both evaluations, competing at the high school, college, minor league or Major League level. After warming up, each participant pitched 10 full-effort fastballs. Automated three-dimensional motion analysis was used to compute eight kinematic parameters, which were compared with a database of elite professional pitchers. Flaws, defined as deviations from the elite range, were explained to each participant or coach after his initial evaluation. Data from the second evaluation revealed that 44% of all flaws had been corrected. Flaws at the instant of foot contact (stride length, front foot position, shoulder external rotation, shoulder abduction, elbow flexion) or slightly after foot contact (time between pelvis rotation and upper trunk rotation) seemed to be corrected more often than flaws near the time of ball release (knee extension and shoulder abduction). Future research may determine which levels of athletes or which training methods are most effective for correcting flaws.


Subject(s)
Baseball/physiology, Feedback, Motor Skills/physiology, Adolescent, Athletic Performance/physiology, Biomechanical Phenomena, Humans, Lower Extremity/physiology, Male, Movement, Retrospective Studies, Time and Motion Studies, Video Recording, Young Adult
12.
Am J Sports Med ; 46(1): 44-51, 2018 01.
Article in English | MEDLINE | ID: mdl-28968146

ABSTRACT

BACKGROUND: Pitching biomechanics are associated with performance and risk of injury in baseball. Previous studies have identified biomechanical differences between youth and adult pitchers but have not investigated changes within individual young pitchers as they mature. HYPOTHESIS: Pitching kinematics and kinetics will change significantly during a youth pitcher's career. STUDY DESIGN: Descriptive laboratory study. METHODS: Pitching biomechanics were captured in an indoor laboratory with a 12-camera, 240-Hz motion analysis system for 51 youth pitchers who were in their first season of organized baseball with pitching. Each participant was retested annually for the next 6 years or until he was no longer pitching. Thirty kinematic and kinetic parameters were computed and averaged for 10 fastballs thrown by each player. Data were statistically analyzed for the 35 participants who were tested at least 3 times. Within-participant changes for each kinematic and kinetic parameter were tested by use of a mixed linear model with random effects (P < .05). Least squares means for sequential ages were compared via Tukey's honestly significant difference test (P < .05). RESULTS: Three kinematic parameters that occur at the instant of foot contact (stride length, lead foot placement to the closed side, and trunk separation) increased with age. With age, shoulder external rotation at foot contact decreased while maximum shoulder external rotation increased. Shoulder and elbow forces and torques increased significantly with age. Year-to-year changes were most significant between 9 and 13 years of age for kinematics and between 13 and 15 years for normalized kinetics (ie, scaled by bodyweight and height). CONCLUSION: During their first few years, youth pitchers improve their kinematics. Elbow and shoulder kinetics increase with time, particularly after age 13. Thus, prepubescent pitchers may work with their coaches to improve the motions and flexibility of the players' bodies and the paths of their arms. Once proper mechanics are developed, adolescent pitchers can focus more on improving strength and power.


Subject(s)
Baseball/physiology, Elbow Joint/physiology, Shoulder Joint/physiology, Adolescent, Biomechanical Phenomena, Child, Foot, Humans, Longitudinal Studies, Male, Rotation, Torque, Torso
13.
Am J Sports Med ; 46(1): 109-115, 2018 Jan.
Article in English | MEDLINE | ID: mdl-28942657

ABSTRACT

BACKGROUND: Few studies have documented the outcomes of superior labral anterior-posterior (SLAP) repairs in baseball players. Furthermore, the results of these previous studies varied widely and were based on small numbers of patients. HYPOTHESIS/PURPOSE: The purpose was to report return-to-play (RTP) rates and validated subjective outcome scores for baseball players after SLAP repair. It was hypothesized that RTP rates and outcomes would differ significantly between pitchers and nonpitchers, as well as among baseball levels. STUDY DESIGN: Case series; Level of evidence, 4. METHODS: A series of 216 baseball players was identified who had isolated SLAP repair or SLAP repair with debridement of a partial-thickness (<25%) rotator cuff tear at our surgical centers. Patients were contacted by phone a minimum of 2 years after surgery and asked questions about their ability to RTP. Patients were also asked questions to complete the Western Ontario Shoulder Instability Index (WOSI), Veterans RAND 12-Item Health Survey (VR-12), and Kerlan-Jobe Orthopaedic Clinic (KJOC) questionnaires. Statistical equivalence in RTP rate, VR-12, and WOSI scores between players with and without concomitant rotator cuff debridement was determined using 2 one-sided tests and risk difference measures. Differences in RTP were tested among baseball levels (high school, college, professional) and positions (pitcher vs nonpitcher) using chi-square analyses (P < .05). Differences in outcome scores were compared using t tests and analyses of variance (P < .05). RESULTS: Of the 216 baseball players, 133 were reached by phone for a follow-up interview (mean, 78 months; range, 27-146 months). Overall, 62% successfully returned to play. There were no differences in RTP rates or subjective outcomes among baseball levels or between procedures. RTP rates were 59% for pitchers and 76% for nonpitchers (P = .060). Subjectively, the percentage of patients who felt the same or better at follow-up compared with preinjury was significantly higher among nonpitchers (66%) than pitchers (43%). There was no difference in KJOC scores between the pitchers (75.3 ± 19.4) and nonpitchers (76.2 ± 17.4) who successfully returned to play, although these scores were well below the minimum desired score of 90 for healthy baseball players. CONCLUSION: SLAP repair should continue to be considered as an option for SLAP tear treatment only after nonsurgical management has failed. Some players may be able to return to baseball after SLAP repair, although regaining preinjury health and performance is challenging.
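The "2 one-sided tests" (TOST) procedure referenced in the methods declares equivalence when both one-sided tests against a pre-set margin are significant. A minimal sketch on simulated outcome scores; the group sizes, score distributions, and ±10-point margin are assumptions, not study values:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Simulated WOSI-style scores: isolated SLAP repair vs repair with
# rotator cuff debridement (placeholder means/SDs, not study data).
repair_only = rng.normal(75, 18, size=90)
repair_debride = rng.normal(74, 18, size=43)

margin = 10.0  # assumed equivalence margin in score points

diff = repair_only.mean() - repair_debride.mean()
se = np.sqrt(repair_only.var(ddof=1) / repair_only.size
             + repair_debride.var(ddof=1) / repair_debride.size)
dof = repair_only.size + repair_debride.size - 2

# TOST: two one-sided t-tests against -margin and +margin.
p_lower = stats.t.sf((diff + margin) / se, dof)  # H0: diff <= -margin
p_upper = stats.t.cdf((diff - margin) / se, dof)  # H0: diff >= +margin
p_tost = max(p_lower, p_upper)

print(f"diff = {diff:.1f}, TOST p = {p_tost:.4f} "
      f"({'equivalent' if p_tost < 0.05 else 'not shown equivalent'})")
```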


Subject(s)
Athletic Injuries/surgery, Baseball/injuries, Debridement, Return to Sport, Rotator Cuff Injuries/surgery, Adolescent, Adult, Child, Humans, Male, Ontario, Rotator Cuff/surgery, Shoulder Joint/surgery, Young Adult
14.
Am J Sports Med ; 45(8): 1815-1821, 2017 Jul.
Article in English | MEDLINE | ID: mdl-28467122

ABSTRACT

BACKGROUND: Anterior cruciate ligament (ACL) injuries occur commonly in football. Recent work has reported ACL reconstruction (ACLR) as one of several orthopaedic procedures with unfavorable outcomes for professional athletes. The performance impact on defensive players after surgery has not been quantified. PURPOSE: To quantify the effect of ACLR on the performance of defensive players by comparing them to a cohort of matched controls, as well as to measure the effect of ACLR on athletes' career length in the National Football League (NFL). STUDY DESIGN: Case-control and cohort study; Level of evidence, 3. METHODS: Thirty-eight NFL defensive players with a history of ACLR from 2006 to 2012 were identified. For each injured player, a matched control player was identified. Demographic and performance statistics were collected from the online NFL player database. Players who returned after ACLR (n = 23) were compared with players who did not return (n = 15) using t tests and chi-squared analyses. Similarly, players who returned after ACLR (n = 23) were compared with their matched controls with t tests and chi-squared analyses. Two-way repeated-measures analysis of variance was used to test for significant differences between performance before and after the season of the injury for the players in the ACLR group who returned (n = 23) and for their matched controls. Kaplan-Meier analysis was performed to test for differences in the rate of retirement between the groups. For all analyses, P values <.05 were considered significant. RESULTS: Approximately 74% (28/38) of athletes who underwent ACLR returned to play at least 1 NFL game, and 61% (23/38) successfully returned to play at least half a season (ie, 8 games). Athletes in the ACLR group who returned retired from the NFL significantly sooner and more often after surgery than their matched controls. In the seasons leading up to their injury, athletes who successfully returned to play started a greater percentage of their games (81%) and made more solo tackles per game (3.44 ± 1.47) compared with athletes in the ACLR group who did not return to play (54% and 1.82 ± 1.17, respectively) and compared with healthy control players (52% and 1.77 ± 1.19, respectively). After the season of surgery, athletes in the ACLR group who returned to play decreased to 57% of games started and 2.38 ± 1.24 solo tackles per game, while their matched controls suffered no significant decreases. CONCLUSION: Players who successfully returned were above-average NFL players before their injury but comparatively average after their return.
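The retirement comparison above uses Kaplan-Meier survival analysis, which can be sketched with the third-party lifelines library; the simulated career lengths below are placeholders that only match the direction of the reported finding:

```python
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(4)

# Placeholder career lengths (seasons after the index year), with
# shorter careers for the ACLR group; 1 = retirement observed.
aclr_seasons = rng.exponential(3.0, size=23).round() + 1
ctrl_seasons = rng.exponential(5.0, size=23).round() + 1
aclr_retired = np.ones_like(aclr_seasons)
ctrl_retired = np.ones_like(ctrl_seasons)

# Kaplan-Meier survival curves for each group.
kmf_aclr = KaplanMeierFitter().fit(aclr_seasons, event_observed=aclr_retired,
                                   label="ACLR")
kmf_ctrl = KaplanMeierFitter().fit(ctrl_seasons, event_observed=ctrl_retired,
                                   label="Control")
print(f"Median career: ACLR {kmf_aclr.median_survival_time_} vs "
      f"control {kmf_ctrl.median_survival_time_} seasons")

# Log-rank test for a difference in retirement rates.
result = logrank_test(aclr_seasons, ctrl_seasons,
                      event_observed_A=aclr_retired,
                      event_observed_B=ctrl_retired)
print(f"Log-rank p = {result.p_value:.3f}")
```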


Subject(s)
Anterior Cruciate Ligament Injuries/surgery, Anterior Cruciate Ligament Reconstruction/statistics & numerical data, Athletic Performance/statistics & numerical data, Football/statistics & numerical data, Return to Sport/statistics & numerical data, Adult, Case-Control Studies, Cohort Studies, Football/injuries, Humans, Kaplan-Meier Estimate, Male, United States, Young Adult
15.
Sports Health ; 9(1): 52-58, 2017.
Article in English | MEDLINE | ID: mdl-27760844

ABSTRACT

BACKGROUND: Extreme conditioning programs (ECPs) are fitness training regimens relying on aerobic, plyometric, and resistance training exercises, often with high levels of intensity for a short duration of time. These programs have grown rapidly in popularity in recent years, but science describing their safety profile is lacking. HYPOTHESIS: The rate of injury in the extreme conditioning program is greater than the injury rate of weightlifting, and the majority of injuries occur to the shoulder and back. STUDY DESIGN: Cross-sectional study. LEVEL OF EVIDENCE: Level 4. METHODS: This is a retrospective survey of injuries reported by athletes participating in an ECP. An injury survey was sent to 1100 members of Iron Tribe Fitness, a gym franchise with 5 locations across Birmingham, Alabama, that employs exercises consistent with an ECP. An injury was defined as a physical condition resulting from ECP participation that caused the athlete to either seek medical treatment, take time off from exercising, or make modifications to his or her technique to continue. RESULTS: A total of 247 athletes (22%) completed the survey. The majority (57%) of athletes were male (n = 139), and 94% of athletes were white (n = 227). The mean age of athletes was 38.9 years (±8.9 years). Athletes reported participation in the ECP for, on average, 3.6 hours per week (±1.2 hours). Eighty-five athletes (34%) reported that they had sustained an injury while participating in the ECP. A total of 132 injuries were recorded, yielding an estimated incidence of 2.71 per 1000 hours. The shoulder or upper arm was the most commonly injured body site, accounting for 38 injuries (15% of athletes). Athletes with a previous shoulder injury were 8.1 times as likely to injure their shoulder in the ECP as athletes with healthy shoulders. The trunk, back, head, or neck (n = 29, 12%) and the leg or knee (n = 29, 12%) were the second most commonly injured sites. The injury incidence rate among athletes with <6 months of experience in the ECP was 2.5 times greater than that of more experienced athletes (≥6 months of experience). Of the 132 injuries, 23 (17%) required surgical intervention. Squat cleans, ring dips, overhead squats, and push presses were the exercises most likely to cause injury. Athletes reported that 35% of injuries were due to overexertion and 20% were due to improper technique. CONCLUSION: The estimated injury rate among athletes participating in this ECP was similar to the rate of injury in weightlifting and most other recreational activities. The shoulder or upper arm was the most commonly injured area, and previous shoulder injury predisposed athletes to new shoulder injury. New athletes are at considerable risk of injury compared with more experienced athletes. CLINICAL RELEVANCE: Extreme conditioning programs are growing in popularity, and there is disagreement between the scientific literature and anecdotal reports from athletes, coaches, and physicians about their relative safety. This study estimates the incidence of injury in extreme conditioning programs, which appears to be similar to that of other weight-training programs.
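The headline incidence figure is injuries divided by total exposure hours, scaled to 1000 hours. A back-of-the-envelope sketch using the numbers in this abstract; the one-year recall window is an assumption, and any discrepancy from the reported 2.71/1000 h would reflect the authors' use of actual per-athlete exposure rather than this uniform approximation:

```python
# Back-of-the-envelope incidence calculation matching the abstract's
# definition; the exposure window (one year) is an assumption.
athletes = 247
hours_per_week = 3.6
weeks = 52                      # assumed recall period
injuries = 132

exposure_hours = athletes * hours_per_week * weeks
incidence_per_1000h = injuries / exposure_hours * 1000
print(f"{incidence_per_1000h:.2f} injuries per 1000 training hours")
```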

16.
Sports Health ; 9(3): 210-215, 2017.
Article in English | MEDLINE | ID: mdl-27872403

ABSTRACT

BACKGROUND: Weighted-ball throwing programs are commonly used in training baseball pitchers to increase ball velocity. The purpose of this study was to compare kinematics and kinetics among weighted-ball exercises with values from standard pitching (ie, pitching standard 5-oz baseballs from a mound). HYPOTHESIS: Ball and arm velocities would be greater with lighter balls, and joint kinetics would be greater with heavier balls. STUDY DESIGN: Controlled laboratory study. METHODS: Twenty-five high school and collegiate baseball pitchers experienced with weighted-ball throwing were tested with an automated motion capture system. Each participant performed 3 trials of 10 different exercises: pitching 4-, 5-, 6-, and 7-oz baseballs from a mound; flat-ground crow hop throws with 4-, 5-, 6-, and 7-oz baseballs; and flat-ground hold exercises with 14- and 32-oz balls. Twenty-six biomechanical parameters were computed for each trial. Data among the 10 exercises were compared with repeated measures analysis of variance and post hoc paired t tests against the standard pitching data. RESULTS: Ball velocity increased as ball mass decreased. There were no differences in arm and trunk velocities between throwing a standard baseball and an underweight baseball (4 oz), while arm and trunk velocities steadily decreased as ball weight increased from 5 to 32 oz. Compared with values pitching from a mound, velocities of the pelvis, shoulder, and ball were increased for flat-ground throws. In general, as ball mass increased, arm torques and forces decreased; the exception was elbow flexion torque, which was significantly greater for the flat-ground holds. There were significant differences in body positions when pitching on the mound, throwing on flat ground, and performing holds. CONCLUSIONS: While ball velocity was greatest throwing underweight baseballs, results from the study did not support the rest of the hypothesis. Kinematics and kinetics were similar between underweight and standard baseballs, while overweight balls correlated with decreased arm forces, torques, and velocities. Increased ball velocity and joint velocities were produced with crow hop throws, likely because of running forward while throwing. CLINICAL RELEVANCE: As pitching slightly underweight and overweight baseballs produces variations in kinematics without increased arm kinetics, these exercises seem reasonable for training pitchers. As flat-ground throwing produces increased shoulder internal rotation velocity and elbow varus torque, these exercises may be beneficial but may also be stressful and risky. Flat-ground holds with heavy balls should not be viewed as enhancing pitching biomechanics, but rather as hybrid exercises between throwing and resistance training.
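The analysis pattern here, repeated measures ANOVA followed by post hoc paired t tests against the standard pitch, can be sketched with statsmodels and scipy. The simulated torques below are placeholders; the real study compared 26 parameters across 10 exercises:

```python
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(5)

# Simulated elbow-varus torque for 25 pitchers x 4 mound pitch weights.
pitchers = np.repeat(np.arange(25), 4)
weights = np.tile([4, 5, 6, 7], 25)              # ball weight (oz)
base = rng.normal(55, 5, 25)                      # per-pitcher baseline (N*m)
torque = base[pitchers] - 0.8 * (weights - 5) + rng.normal(0, 2, 100)
df = pd.DataFrame({"pitcher": pitchers, "weight_oz": weights,
                   "torque": torque})

# Repeated-measures ANOVA across ball weights.
res = AnovaRM(df, depvar="torque", subject="pitcher",
              within=["weight_oz"]).fit()
print(res)

# Post hoc paired t-test: each weight vs the standard 5-oz ball.
std = df[df["weight_oz"] == 5].sort_values("pitcher")["torque"].to_numpy()
for w in (4, 6, 7):
    alt = df[df["weight_oz"] == w].sort_values("pitcher")["torque"].to_numpy()
    t, p = stats.ttest_rel(alt, std)
    print(f"{w} oz vs 5 oz: t = {t:.2f}, p = {p:.4f}")
```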


Subject(s)
Baseball/physiology, Lower Extremity/physiology, Upper Extremity/physiology, Arm/physiology, Baseball/injuries, Biomechanical Phenomena, Elbow/physiology, Electromyography, Foot/physiology, Humans, Knee/physiology, Pelvis/physiology, Risk Factors, Shoulder/physiology, Time and Motion Studies, Torso/physiology
17.
Orthop J Sports Med ; 4(10): 2325967116668138, 2016 Oct.
Article in English | MEDLINE | ID: mdl-27790624

ABSTRACT

BACKGROUND: Tennis-teaching professionals represent a significant proportion of all avid tennis players worldwide, with 15,000 belonging to the largest professional organization, the United States Professional Tennis Association (USPTA). However, there is no epidemiologic study to date reporting the prevalence of musculoskeletal conditions in these tennis-teaching professionals. PURPOSE: To investigate the prevalence of musculoskeletal conditions in tennis-teaching professionals following the International Tennis Federation's (ITF) guidelines for epidemiologic studies. STUDY DESIGN: Descriptive epidemiology study. METHODS: Electronic surveys were distributed to 13,500 American members of the USPTA. The prevalence of musculoskeletal conditions was calculated. RESULTS: A total of 1176 USPTA members completed the survey. Most participants reported teaching more than 5 days per week and more than 2 hours per day. The prevalence of musculoskeletal injury secondary to teaching tennis was 42%. The most affected area was the lower extremities (43% of all injuries), followed by the upper extremities (37%). The most commonly injured structures were muscles or tendons (36% of all injuries) and joints or ligaments (28%). The majority of injuries did not cause participants to miss more than 24 hours of teaching (57%). CONCLUSION: This is the first epidemiologic study of the occupational risk of musculoskeletal injuries and conditions in tennis-teaching professionals. Tennis-teaching professionals have a significant risk of musculoskeletal injuries or conditions related to their occupation. The prevalence of injury is consistent with previously published studies of injury prevalence among other tennis-playing populations. The proportions of upper and lower extremity injuries were roughly similar.

18.
Arthroscopy ; 32(11): 2278-2284, 2016 11.
Article in English | MEDLINE | ID: mdl-27160462

ABSTRACT

PURPOSE: To determine common mechanisms of anterior cruciate ligament (ACL) injury in baseball players, to quantify the rate of return to play after primary surgical reconstruction, and to review intermediate clinical outcomes. METHODS: Surgically treated ACL injuries in youth, high school, collegiate, and professional baseball players were queried over an 11-year period (2001 to 2011). Over the study period, 42 baseball players were identified who had undergone arthroscopically assisted primary ACL reconstruction by 1 of 3 attending surgeons. A retrospective chart review was performed for all 42 patients to evaluate age, level of competition, position, mechanism of injury, graft choice, and associated meniscal injuries. Twenty-six patients were reached for a telephone survey and the International Knee Documentation Committee questionnaire and answered questions about their original injury and playing history. RESULTS: The most common mechanism of injury was fielding, followed by base running. Infielders and outfielders (32% each) were the most commonly injured positions, followed by pitchers (29%). Among the 32 players for whom it could be determined, 30 (94%) were able to return to playing baseball at a mean follow-up of 4.2 years (range 1.0 to 9.9 years). The mean International Knee Documentation Committee score was 84.0 (range 63 to 91). Among the 26 patients contacted for a telephone interview, none required revision ACL surgery, but 3 required a subsequent procedure for a meniscal tear. Twenty-five patients (96%) denied any episodes of instability in the knee after reconstruction. CONCLUSIONS: The overwhelming majority of baseball players who sustain ACL injuries do so while fielding or base running. Outfielders are significantly more likely than infielders to suffer ACL injuries while fielding rather than base running. The results with respect to return to play are promising, as nearly all patients were able to return to baseball and none required revision ACL surgery at a mean follow-up of 4.2 years. LEVEL OF EVIDENCE: Level IV, therapeutic case series.


Subject(s)
Anterior Cruciate Ligament Injuries/etiology, Baseball/injuries, Return to Sport/statistics & numerical data, Adolescent, Adult, Anterior Cruciate Ligament Injuries/surgery, Anterior Cruciate Ligament Reconstruction, Humans, Male, Retrospective Studies, Young Adult
19.
Sports Biomech ; 15(2): 128-38, 2016 Jun.
Article in English | MEDLINE | ID: mdl-27110899

ABSTRACT

Controversy continues over whether curveballs are stressful for young baseball pitchers. Furthermore, it is unproven whether professional baseball pitchers have fewer kinematic differences between fastballs and off-speed pitches than lower-level pitchers. Kinematic and kinetic data were measured for 111 healthy baseball pitchers (26 youth, 21 high school, 20 collegiate, 26 minor league, and 18 major league level) throwing fastballs, curveballs, and change-ups in an indoor biomechanics laboratory with a high-speed, automated digitising system. Differences between pitch types and between competition levels were analysed with repeated measures ANOVA. Shoulder and elbow kinetics were greater in fastballs than in change-ups, while curveball kinetics were not different from those of the other two pitch types. Kinematic angles at the instant of ball release varied between pitch types, while kinematic angles at the instant of lead foot contact varied between competition levels. There were no significant interactions between pitch type and competition level, meaning that kinetic and kinematic differences between pitch types did not vary by competition level. Like previous investigations, this study did not support the theory that curveballs are relatively more stressful for young pitchers. Although pitchers desire consistent kinematics, there were differences between pitch types, independent of competition level.


Subject(s)
Baseball/physiology, Competitive Behavior/physiology, Elbow/physiology, Motor Skills/physiology, Shoulder/physiology, Adolescent, Age Factors, Biomechanical Phenomena, Child, Humans, Range of Motion, Articular, Young Adult
20.
Sports Biomech ; 15(1): 36-47, 2016.
Article in English | MEDLINE | ID: mdl-26836969

ABSTRACT

Swing trajectory and ground reaction forces (GRF) of 30 collegiate baseball batters hitting a pitched ball were compared between a standard bat, a bat with extra weight about its barrel, and a bat with extra weight in its handle. It was hypothesised that when compared to a standard bat, only a handle-weighted bat would produce equivalent bat kinematics. It was also hypothesised that hitters would not produce equivalent GRFs for each weighted bat, but would maintain equivalent timing when compared to a standard bat. Data were collected utilising a 500 Hz motion capture system and 1,000 Hz force plate system. Data between bats were considered equivalent when the 95% confidence interval of the difference was contained entirely within ±5% of the standard bat mean value. The handle-weighted bat had equivalent kinematics, whereas the barrel-weighted bat did not. Both weighted bats had equivalent peak GRF variables. Neither weighted bat maintained equivalence in the timing of bat kinematics and some peak GRFs. The ability to maintain swing kinematics with a handle-weighted bat may have implications for swing training and warm-up. However, altered timings of kinematics and kinetics require further research to understand the implications on returning to a conventionally weighted bat.
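The equivalence criterion in this abstract is concrete: the 95% CI of the between-bat difference must lie entirely within ±5% of the standard-bat mean. A minimal sketch of that check on simulated paired bat-speed data; the variable values are placeholders, not the study's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# Placeholder paired bat-speed data (m/s) for 30 hitters.
standard = rng.normal(31.0, 2.0, size=30)
handle_weighted = standard + rng.normal(0.1, 0.6, size=30)

diff = handle_weighted - standard
mean_diff = diff.mean()
se = diff.std(ddof=1) / np.sqrt(diff.size)
t_crit = stats.t.ppf(0.975, df=diff.size - 1)
ci = (mean_diff - t_crit * se, mean_diff + t_crit * se)

# Equivalence bound: +/-5% of the standard-bat mean.
bound = 0.05 * standard.mean()
equivalent = -bound < ci[0] and ci[1] < bound
print(f"95% CI of difference: ({ci[0]:.2f}, {ci[1]:.2f}); "
      f"bound = +/-{bound:.2f}; equivalent: {equivalent}")
```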


Subject(s)
Baseball/physiology, Sports Equipment, Arm/physiology, Biomechanical Phenomena, Foot/physiology, Humans, Male, Torso/physiology, Young Adult